N-bit parity neural networks: new solutions based on linear programming

Authors

  • Derong Liu
  • Myron E. Hohil
  • Stanley H. Smith
Abstract

In this paper, the N-bit parity problem is solved with a neural network that allows direct connections between the input layer and the output layer. The activation function used in the hidden and output layer neurons is the threshold function. It is shown that this choice of activation function and network structure leads to several solutions for the 3-bit parity problem using linear programming. One of the solutions for the 3-bit parity problem is then generalized to obtain a solution for the N-bit parity problem using N/2 hidden layer neurons. Compared to other existing solutions in the literature, the present solution is more systematic and simpler. Furthermore, the present solution can be simplified by using a single hidden layer neuron with a “staircase” type activation function instead of N/2 hidden layer neurons. The present activation function is easier to implement in hardware than those in the literature for N-bit parity networks. We also review similarities and differences between the present results and those obtained in the literature. © 2002 Elsevier Science B.V. All rights reserved.
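
As a concrete illustration of the construction described above, the sketch below computes N-bit parity with hard-threshold units, direct input-to-output connections, and floor(N/2) hidden units, plus the single-hidden-unit "staircase" variant. The particular weights and thresholds (hidden unit j firing when the input sum reaches 2j, direct connections of weight +1, hidden-to-output weights of -2, staircase activation g(s) = floor(s/2)) are illustrative assumptions and are not claimed to be the weights the paper derives via linear programming.

    # Minimal sketch (assumed weights, not the paper's linear-programming solution):
    # N-bit parity with threshold units, direct input-output connections, and
    # floor(N/2) hidden units, plus a single "staircase"-unit variant.
    from itertools import product

    def step(x):
        # Hard threshold activation: 1 if x >= 0, else 0.
        return 1 if x >= 0 else 0

    def parity_half_n(bits):
        # floor(N/2) hidden units; unit j fires when sum(bits) >= 2j and feeds
        # the output with weight -2; inputs connect directly to the output with
        # weight +1, so the output's net input is s - 2*floor(s/2) = s mod 2.
        n, s = len(bits), sum(bits)
        hidden = [step(s - 2 * j) for j in range(1, n // 2 + 1)]
        return step(s - 2 * sum(hidden) - 0.5)

    def parity_staircase(bits):
        # Same idea with one hidden unit whose "staircase" activation
        # g(s) = floor(s/2) replaces the floor(N/2) threshold units.
        s = sum(bits)
        return step(s - 2 * (s // 2) - 0.5)

    if __name__ == "__main__":
        N = 8
        for bits in product([0, 1], repeat=N):
            expected = sum(bits) % 2
            assert parity_half_n(bits) == expected
            assert parity_staircase(bits) == expected
        print("both constructions reproduce %d-bit parity" % N)

Running the script checks both variants against all 2^8 input patterns; the same weights work for any N because floor(s/2) never exceeds floor(N/2) when s is the number of active inputs.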

Related articles

Solving Linear Semi-Infinite Programming Problems Using Recurrent Neural Networks

The linear semi-infinite programming problem is an important class of optimization problems that deals with infinitely many constraints. In this paper, to solve this problem, we combine a discretization method and a neural network method. By a simple discretization of the infinite constraints, we convert the linear semi-infinite programming problem into a linear programming problem. Then, we use...
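
As a rough sketch of the discretization step mentioned above (the recurrent neural network solver itself is not reproduced here), the toy example below samples a semi-infinite constraint family at a finite grid and solves the resulting ordinary linear program with scipy.optimize.linprog; the constraint family, grid size, and objective are assumptions chosen only for demonstration.

    # Toy sketch: turn a linear semi-infinite program into a finite LP by
    # sampling the constraint index set, then solve the LP. The problem is
    # illustrative and not taken from the cited paper:
    #   maximize x1 + x2  subject to  x1 + t*x2 <= 1 + t**2 for all t in [0, 1],
    #   with x1, x2 >= 0.
    import numpy as np
    from scipy.optimize import linprog

    grid = np.linspace(0.0, 1.0, 101)                   # finite sample of the index set [0, 1]
    A_ub = np.column_stack([np.ones_like(grid), grid])  # one inequality row per sampled t
    b_ub = 1.0 + grid ** 2
    c = np.array([-1.0, -1.0])                          # linprog minimizes, so negate to maximize

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("approximate optimum:", res.x, "objective:", -res.fun)

A finer grid generally tightens this finite relaxation toward the semi-infinite optimum.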

A Recurrent Neural Network Model for Solving Linear Semidefinite Programming

In this paper we solve a wide range of semidefinite programming (SDP) problems by using recurrent neural networks (RNNs). SDP is an important numerical tool for analysis and synthesis in systems and control theory. First, we reformulate the problem as a linear programming problem; second, we reformulate it as a first-order system of ordinary differential equations. Then a recurrent neural network...

A Constructive Algorithm for Feedforward Neural Networks With Incremental Training

We develop, in this brief, a new constructive learning algorithm for feedforward neural networks. We employ an incremental training procedure where training patterns are learned one by one. Our algorithm starts with a single training pattern and a single hidden-layer neuron. During the course of neural network training, when the algorithm gets stuck in a local minimum, we will attempt to escape...

Minimization of Parity-Checked Fault-Secure AND/EXOR Networks

We present a new algorithm for the efficient realization of fault-secure and totally self-checking circuits for multi-output Boolean functions using parity-checked AND/EXOR networks. First results on the minimization of a large set of typical arithmetic and random functions and several MCNC benchmark circuits show that AND/EXOR networks often allow much better solutions than traditional AND/OR ne...

Evolving Neural Networks with Collaborative Species

We present a coevolutionary architecture for solving decomposable problems and apply it to the evolution of artificial neural networks. Although this work is preliminary in nature, it has a number of advantages over non-coevolutionary approaches. The coevolutionary approach utilizes a divide-and-conquer technique in which species representing simpler subtasks are evolved in separate instances of...

Journal title:
  • Neurocomputing

Volume 48, Issue

Pages -

Publication date: 2002